"'68" as a Political Challenge: The Influence of the 1968 Movement on the Transformation of the Political Parties in France and in the Federal Republic of Germany
Numerical investigations of fault propagation and forced folding using a non-smooth discrete element method
Geophysical problems such as forced-fold evolution and fault propagation induce large deformations and many localisations. Continuum mechanics does not seem the most appropriate framework for their description, and it appears more interesting to represent the medium as initially discontinuous. To address both phenomena, a non-smooth Discrete Element Method is used. Geophysical structures are considered as collections of rigid disks which interact through cohesive frictional contact laws. Numerical geophysical formations are correlated to the mechanical properties of the structures through observation and mechanical analysis.
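The ingredients of such a contact law can be illustrated with a minimal sketch. This is not the authors' non-smooth solver, which treats contact laws implicitly; here a signed-gap test between two rigid disks and a penalty-style normal force with a cohesion threshold, using hypothetical stiffness and cohesion values, merely illustrates what a cohesive frictional interaction between disks involves.

```python
# Illustrative sketch only: disk-disk contact detection and a simple
# cohesive normal force law. A true non-smooth DEM resolves contacts
# implicitly; stiffness and cohesion values below are hypothetical.
import math

def normal_gap(p1, r1, p2, r2):
    """Signed gap between two disks: negative means overlap (contact)."""
    d = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return d - (r1 + r2)

def cohesive_normal_force(gap, stiffness=1e4, cohesion=5.0):
    if gap < 0:
        return -stiffness * gap   # repulsive penalty force when overlapping
    if gap < 1e-3:
        return -cohesion          # small attractive (cohesive) force near contact
    return 0.0                    # no interaction at a distance

g = normal_gap((0.0, 0.0), 1.0, (1.9, 0.0), 1.0)   # overlap of 0.1
f = cohesive_normal_force(g)                        # repulsive force
```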
Optimal diversification in the presence of parameter uncertainty for a risk averse investor
We consider an investor who faces parameter uncertainty in a continuous-time financial market. We model the investor's preference by a power utility function leading to constant relative risk aversion. We show that the loss in expected utility is large when using a simple plug-in strategy for unknown parameters. We also provide theoretical results that show the trade-off between holding a well-diversified portfolio and a portfolio that is robust against estimation errors. To reduce the effect of estimation, we constrain the weights of the risky assets with an L1-norm, leading to a sparse portfolio. We provide analytical results that show how the sparsity of the constrained portfolio depends on the coefficient of relative risk aversion. Based on a simulation study, we demonstrate the existence and the uniqueness of an optimal bound on the L1-norm for each level of relative risk aversion.
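The two ingredients the abstract combines can be sketched under strong simplifying assumptions (independent risky assets, illustrative parameter values, function names not from the paper): Merton-style weights under constant relative risk aversion, followed by Euclidean projection onto an L1 ball, which soft-thresholds small positions and thereby produces sparsity.

```python
# Hypothetical illustration, not the paper's model: CRRA-optimal weights for
# independent risky assets, then Euclidean projection onto an L1 ball
# (soft-thresholding), which sets the smallest positions to zero.

def merton_weights(mu, r, sigma2, gamma):
    """Unconstrained optimal weights w_i = (mu_i - r) / (gamma * sigma_i^2)."""
    return [(m - r) / (gamma * s2) for m, s2 in zip(mu, sigma2)]

def project_l1(w, c):
    """Euclidean projection of w onto the L1 ball of radius c."""
    if sum(abs(x) for x in w) <= c:
        return list(w)
    u = sorted((abs(x) for x in w), reverse=True)
    css, theta = 0.0, 0.0
    for i, ui in enumerate(u, start=1):
        css += ui
        t = (css - c) / i
        if ui > t:              # the condition holds for a prefix of u
            theta = t
    return [max(abs(x) - theta, 0.0) * (1.0 if x >= 0 else -1.0) for x in w]

mu     = [0.08, 0.05, 0.11, 0.04]   # expected returns (illustrative)
sigma2 = [0.04, 0.02, 0.09, 0.01]   # variances (illustrative)
w = merton_weights(mu, r=0.02, sigma2=sigma2, gamma=5.0)  # ~[0.3, 0.3, 0.2, 0.4]
w_sparse = project_l1(w, c=0.3)     # a tight L1 bound drops the weakest asset
```

Tightening the bound `c` shrinks and eventually zeroes the smallest positions, which is the sparsity effect the abstract analyzes as a function of risk aversion.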
Continuation of Nesterov's Smoothing for Regression with Structured Sparsity in High-Dimensional Neuroimaging
Predictive models can be used on high-dimensional brain images for diagnosis
of a clinical condition. Spatial regularization through structured sparsity
offers new perspectives in this context and reduces the risk of overfitting the
model while providing interpretable neuroimaging signatures by forcing the
solution to adhere to domain-specific constraints. Total Variation (TV)
enforces spatial smoothness of the solution while segmenting predictive regions
from the background. We consider the problem of minimizing the sum of a smooth
convex loss, a non-smooth convex penalty (whose proximal operator is known) and
a wide range of possible complex, non-smooth convex structured penalties such
as TV or overlapping group Lasso. Existing solvers are either limited in the
functions they can minimize or in their practical capacity to scale to
high-dimensional imaging data. Nesterov's smoothing technique can be used to
minimize a large number of non-smooth convex structured penalties but
reasonable precision requires a small smoothing parameter, which slows down the
convergence speed. To benefit from the versatility of Nesterov's smoothing
technique, we propose a first order continuation algorithm, CONESTA, which
automatically generates a sequence of decreasing smoothing parameters. The
generated sequence maintains the optimal convergence speed towards any globally
desired precision. Our main contributions are: To propose an expression of the
duality gap to probe the current distance to the global optimum in order to
adapt the smoothing parameter and the convergence speed. We provide a
convergence rate, which is an improvement over classical proximal gradient
smoothing methods. We demonstrate on both simulated and high-dimensional
structural neuroimaging data that CONESTA significantly outperforms many
state-of-the-art solvers in regard to convergence speed and precision.
Comment: 11 pages, 6 figures, accepted in IEEE Transactions on Medical Imaging 201
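The continuation idea can be illustrated on a one-dimensional toy problem. This is not the CONESTA solver itself (no duality-gap control, plain gradient descent instead of an accelerated method): a Huber-type smoothing of |x| stands in for Nesterov's smoothing, and the smoothing parameter is simply halved at each outer iteration.

```python
# Toy continuation loop (not the paper's CONESTA): minimize
# f(x) = 0.5*(x - b)**2 + lam*|x| by smoothing |x| with parameter mu and
# shrinking mu.  The exact minimizer is soft-thresholding:
# x* = sign(b) * max(|b| - lam, 0).

def grad_smoothed(x, b, lam, mu):
    """Gradient of 0.5*(x-b)^2 + lam*huber_mu(x), huber_mu smoothing |x|."""
    g_abs = x / mu if abs(x) <= mu else (1.0 if x > 0 else -1.0)
    return (x - b) + lam * g_abs

def continuation_solve(b, lam, mu0=1.0, outer=20, inner=200):
    x, mu = 0.0, mu0
    for _ in range(outer):
        step = 1.0 / (1.0 + lam / mu)   # 1 / Lipschitz constant of the gradient
        for _ in range(inner):
            x -= step * grad_smoothed(x, b, lam, mu)
        mu *= 0.5                       # continuation: shrink smoothing parameter
    return x

x_hat = continuation_solve(b=2.0, lam=0.5)   # close to max(2.0 - 0.5, 0) = 1.5
```

Shrinking `mu` trades smoothness for fidelity: a large `mu` allows big, stable steps early on, while a small `mu` makes the smoothed problem approach the original non-smooth one. CONESTA's contribution is choosing this sequence from the duality gap so the optimal rate is maintained.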
Compiled simulator for a multi-language description of heterogeneous systems
The design of heterogeneous systems requires two main steps: modeling and simulation.
Usually, simulators are connected and synchronized using a co-simulation bus. These
current approaches have many disadvantages: they are not always adapted to distributed
environments, the simulation execution time can be very disappointing, and each
simulator has its own simulation kernel. We propose a new approach which consists in
developing a multi-language compiled simulator where each model can be described
using various modeling languages such as SystemC, ESyS.Net or others. Each model
contains modules and communication links between them. These modules describe the
functionality of a desired system. They are described using object-oriented
programming, with a syntax chosen by the user.
We thus propose a separation between the modeling language and the simulation. The
models are transformed into the same internal representation, which can be seen as a
set of objects. Our environment compiles these internal objects into a unified code,
instead of using several modeling languages that add many communication mechanisms
and extra information. Optimizations can include various mechanisms such as merging
processes into a single sequential process while respecting the semantics of the
models. We use two abstraction levels, the "register transfer level" (RTL) and
"transaction-level modeling" (TLM). RTL allows modeling at a low level of
abstraction, where communication between modules is done with signals. TLM models
transactional communication at a higher level of abstraction than RTL. Our aim is to
support both types of simulation while letting the user choose the modeling
language. Likewise, we propose to use a single kernel instead of several and to
remove the co-simulation bus in order to speed up simulation.
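A toy sketch of the proposed architecture, with hypothetical class names: modules from different front-end languages are assumed to have already been lowered to one internal representation, and "compilation" merges every process into a single sequential schedule driven by one kernel, with no co-simulation bus.

```python
# Hypothetical sketch, not the thesis's actual framework: a common internal
# representation (Signal, Module) plus a scheduler that merges all processes
# from all modules into one sequential loop, i.e. a single simulation kernel.

class Signal:
    def __init__(self, value=0):
        self.value = value       # current value, updated in place

class Module:
    """Internal-representation module: a name plus a list of processes."""
    def __init__(self, name, processes):
        self.name, self.processes = name, processes

def compile_and_run(modules, cycles):
    # Merge every process from every module into one sequential schedule.
    schedule = [p for m in modules for p in m.processes]
    for _ in range(cycles):
        for process in schedule:
            process()

# Two "modules" that could have come from different modeling languages.
a, b = Signal(1), Signal(0)
producer = Module("producer", [lambda: setattr(a, "value", a.value + 1)])
consumer = Module("consumer", [lambda: setattr(b, "value", a.value * 2)])
compile_and_run([producer, consumer], cycles=3)
# a is incremented 3 times from 1 to 4; b follows as 2*a = 8
```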
Développement de techniques analytiques pour l'évaluation des protéines thérapeutiques et des biomarqueurs par spectrométrie de masse
Although liquid chromatography coupled to mass spectrometry is a commonly used technique for the quantification of low-molecular-weight compounds in biological matrices, this is not yet the case for macromolecules. The aim of this work was to evaluate the possibilities and difficulties of immunoaffinity extraction coupled to mass spectrometry for the quantification of recombinant proteins and biomarkers. This approach was first applied to the absolute quantification of an intact small therapeutic protein, Epi-hNE4, an inhibitor of human neutrophil elastase. Immunoaffinity extraction from plasma led to a lower limit of quantification of 80 pM. The method was then applied to a therapeutic monoclonal antibody used for the treatment of colorectal cancer, Cetuximab. A lower limit of quantification of 130 pM was achieved through an extraction from plasma involving recognition of its biological target, EGFR. This last development offers possibilities for the immunogenicity testing of therapeutic proteins. The gold standard remains the ELISA technique, which we implemented in a screening test for immunogenicity testing of Epi-hNE4, but the difficulties encountered suggested that mass spectrometry is a potential alternative. Finally, multiplexed quantitative mass spectrometry was applied to apelins, a family of peptide biomarkers of 12 to 36 amino acids involved in cardiovascular functions. A limit of quantification of 25 pM was achieved and, interestingly, this method brought evidence of the absence of the presupposed circulating forms.
Europe's Liberal Force: The Social Market Economy in the Federal Republic's European Policy, 1953-1993
After the Second World War, Europe opted for liberalism and chose a specific economic and social order for it: the Social Market Economy. Mathieu Dubois traces the influence of the Federal Republic of Germany and of ordoliberalism on this model. His thesis: contrary to the image of a "political dwarf", West German European policy proved to be a decisive mediator of European compromises. The Federal Republic thereby made an important contribution to the liberalization of the continent and to the founding of an economic and stability community, but also to a monetary union that today functions through constraint.
Topics in portfolio optimisation and systemic risk
This thesis is concerned with different sources of risk occurring in financial markets. We follow a bottom-up approach by carrying out an analysis from the perspective of a single investor to the whole banking system.
We first consider an investor who faces parameter uncertainty in a continuous-time financial market. We model the investor's preference by a power utility function leading to constant relative risk aversion. We show that the loss in expected utility is large when using a simple plug-in strategy for unknown parameters. We also provide theoretical results that show the trade-off between holding a well-diversified portfolio and a portfolio that is robust against estimation errors. To reduce the effect of estimation, we constrain the weights of the risky assets with an L1-norm, leading to a sparse portfolio. We provide analytical results that show how the sparsity of the constrained portfolio depends on the coefficient of relative risk aversion. Based on a simulation study, we demonstrate the existence and the uniqueness of an optimal bound on the L1-norm for each level of relative risk aversion.
Next, we consider the interbank lending market as a network in which the nodes represent banks and the directed edges represent interbank liabilities. The interbank network is characterised by the matrix of liabilities, whose entries are not directly observable, as bank balance sheets provide only total exposures to the interbank market. Using a Bayesian approach, we assume that each entry follows a Gamma-distributed prior. We then construct a Gibbs sampler of the conditional joint distribution of interbank liabilities given total interbank liabilities and total interbank assets. We illustrate our methodology with a stress test on two networks of eleven and seventy-six banks. We identify under which circumstances the choice of the prior influences the stability and the structure of the network.
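The margin-preserving update at the heart of such a sampler can be sketched as follows. This is a hedged illustration, not the thesis's exact algorithm: a Metropolis-within-Gibbs step modifies a random 2x2 "cycle" of entries by ±delta, which leaves all row and column sums unchanged, and accepts the move with the Gamma-prior density ratio. The network size, prior parameters, and proposal are illustrative.

```python
# Illustrative sketch (not the thesis's exact sampler): Metropolis-within-
# Gibbs updates of a liabilities matrix X whose row sums (total interbank
# liabilities) and column sums (total interbank assets) must stay fixed.
import math, random
random.seed(0)

def gamma_logpdf(x, shape, rate):
    """Log-density of a Gamma(shape, rate) prior; -inf off the support."""
    if x <= 0:
        return float("-inf")
    return (shape * math.log(rate) - math.lgamma(shape)
            + (shape - 1) * math.log(x) - rate * x)

def cycle_step(X, shape, rate):
    """One Metropolis update on a random 2x2 cycle; preserves all margins."""
    n = len(X)
    i, j = random.sample(range(n), 2)
    k, l = random.sample(range(n), 2)
    if i in (k, l) or j in (k, l):       # skip diagonal entries (no self-loans)
        return
    lo = -min(X[i][k], X[j][l])          # keep all four entries non-negative
    hi = min(X[i][l], X[j][k])
    delta = random.uniform(lo, hi)
    old = sum(gamma_logpdf(X[a][b], shape, rate)
              for a, b in ((i, k), (i, l), (j, k), (j, l)))
    new = (gamma_logpdf(X[i][k] + delta, shape, rate)
           + gamma_logpdf(X[i][l] - delta, shape, rate)
           + gamma_logpdf(X[j][k] - delta, shape, rate)
           + gamma_logpdf(X[j][l] + delta, shape, rate))
    if math.log(random.random()) < new - old:
        X[i][k] += delta; X[i][l] -= delta
        X[j][k] -= delta; X[j][l] += delta

# Toy 4-bank network with zero diagonal; margins are the "observed" totals.
X = [[0, 2, 1, 1],
     [1, 0, 2, 1],
     [2, 1, 0, 2],
     [1, 2, 1, 0]]
row_sums = [sum(r) for r in X]
col_sums = [sum(X[a][b] for a in range(4)) for b in range(4)]
for _ in range(5000):
    cycle_step(X, shape=2.0, rate=1.0)
# row and column sums are preserved throughout (here 4, 4, 5, 4)
```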